Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 16: Gaps for Max-Cut, Mar 6, 2008
Authors
Abstract
Sdp(G) ≥ Opt(G) ≥ AlgGW(G). We also know the following:

Theorem 1.1. [2] For all graphs G, AlgGW(G) ≥ αGW · Sdp(G), where αGW ≈ 0.878 is a constant.

But this only bounds AlgGW(G) relative to Sdp(G), which is a worry in the following sense: if the graph is such that Opt(G) ≈ 0.5 (measuring cuts as fractions of the edges), and Sdp(G) were also nearly 0.5, the theorem only guarantees a cut of size about 0.878 · 0.5 ≈ 0.44, so the Goemans-Williamson algorithm could return a cut of size below 0.5, performing worse than a uniformly random cut, which cuts half the edges in expectation. This leads us to the following questions:
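The rounding step behind AlgGW can be made concrete. Below is a minimal sketch (not the lecture's code), assuming the SDP solution vectors are already in hand: for the 5-cycle C5 we hard-code its known optimal SDP embedding, unit vectors with angle 4π/5 between neighbors, and round by a random hyperplane.

```python
import math

import numpy as np

# Minimal sketch of Goemans-Williamson hyperplane rounding (illustrative,
# assuming the SDP vectors are given rather than computed by a solver).
# For C5 the optimal SDP embedding places unit vectors at angle 4*pi/5
# between neighbors, so Sdp(C5) = 5 * (1 - cos(4*pi/5)) / 2 ~ 4.52 > Opt = 4.

def gw_round(vectors, edges, rng):
    """Cut the graph by the sign pattern of a random hyperplane."""
    r = rng.standard_normal(vectors.shape[1])  # random hyperplane normal
    side = vectors @ r >= 0                    # which side each vertex falls on
    return sum(1 for i, j in edges if side[i] != side[j])

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
vecs = np.array(
    [[math.cos(4 * math.pi / 5 * i), math.sin(4 * math.pi / 5 * i)] for i in range(5)]
)

rng = np.random.default_rng(0)
best = max(gw_round(vecs, edges, rng) for _ in range(200))
print(best)  # 4: a generic hyperplane cuts 4 of the 5 cycle edges
```

On C5 the SDP value is about 4.52 while Opt = 4, which is exactly the kind of gap instance this lecture studies.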
Similar resources
Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 27: Algorithms for Sparsest Cut, Apr 24, 2008
In lecture 19, we saw an LP relaxation based algorithm to solve the sparsest cut problem with an approximation guarantee of O(log n). In this lecture, we will show that the integrality gap of the LP relaxation is Ω(log n), and hence this is the best approximation factor one can get via this LP relaxation. We will also start developing an SDP relaxation based algorithm which provides an O(√log n) ...
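To make the objective concrete, here is a brute-force check on a toy graph. This is illustrative only (exponential search, nothing like the LP/SDP algorithms in the lecture), and it uses the uniform-demand normalization edges(S, V\S) / (|S|·|V\S|); conventions for sparsity vary.

```python
import itertools

# Brute-force sparsest cut on a toy graph: a 4-cycle plus one chord.
# Sparsity of a cut S is edges crossing (S, V\S) divided by |S| * |V\S|
# (uniform-demand normalization; other conventions exist).

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def sparsity(S):
    S = set(S)
    cut = sum(1 for u, v in edges if (u in S) != (v in S))
    return cut / (len(S) * (n - len(S)))

# Enumerate all nonempty sides up to complementation.
best = min(
    (sparsity(S), S)
    for r in range(1, n // 2 + 1)
    for S in itertools.combinations(range(n), r)
)
print(best)  # (0.6666666666666666, (1,)): S = {1} has sparsity 2/3
```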
Full text
Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 25: Hardness of Max-kCSPs and Max-Independent-Set
Recall that an instance of the Max-kCSP problem is a collection of constraints, each of which is defined over k variables. A random assignment to the variables in a constraint satisfies it with probability at least 1/2^k, so a random assignment satisfies at least a 1/2^k fraction of the constraints in a Max-kCSP instance in expectation. This shows that the hardness result is close to optimal, since there is a trivia...
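The random-assignment expectation can be checked exactly on a tiny instance. The sketch below uses k-AND constraints with an encoding invented for this illustration (each constraint fixes k distinct variables to required bits), for which a uniformly random assignment satisfies each constraint with probability exactly 1/2^k.

```python
import itertools
import random

# Exact check of the random-assignment baseline on k-AND constraints
# (hypothetical encoding, for this sketch only): each constraint is a
# tuple of (variable, required_bit) pairs over k distinct variables, so
# a uniform assignment satisfies it with probability exactly 1/2^k.

k, n = 3, 6
random.seed(1)
constraints = [
    tuple(zip(random.sample(range(n), k), [random.randrange(2) for _ in range(k)]))
    for _ in range(20)
]

# Average the number of satisfied constraints over ALL 2^n assignments,
# giving the exact expectation rather than a sampled estimate.
total = sum(
    sum(all(bits[v] == b for v, b in con) for con in constraints)
    for bits in itertools.product([0, 1], repeat=n)
)
avg = total / (2 ** n * len(constraints))
print(avg)  # 0.125, i.e. exactly 1/2^k
```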
Full text
Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 22: Hardness of Max-E3Lin, April 3, 2008
This lecture begins the proof of 1 − ε vs. 1/2 + ε hardness of Max-E3Lin. The basic idea is similar to previous reductions: reduce from Label-Cover using a gadget that creates 2^K variables corresponding to the key vertices and 2^L variables corresponding to the label vertices, where they correspond in the usual way to {0, 1}^K and {0, 1}^L. We then want to select some subsets x, y, z of these str...
Full text
Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 6: Facility location: greedy and local search algorithms
Recall the greedy algorithm for non-metric facility location that was done in Lecture 4, which was obtained by formulating the problem as set cover. In this section we present a modified greedy algorithm for the metric facility location problem that achieves a constant approximation ratio. Let the facility location instance consist of clients D, facilities F, a metric d on D ∪ F, and facility o...
Full text
Advanced Approximation Algorithms (CMU 15-854B, Spring 2008) Lecture 10: Group Steiner Tree problem
We will be studying the Group Steiner tree problem in this lecture. Recall that the classical Steiner tree problem is the following. Given a weighted graph G = (V, E), a subset S ⊆ V of the vertices, and a root r ∈ V, we want to find a minimum weight tree which connects all the vertices in S to r. The weights on the edges are assumed to be positive. We will now define the Group Steiner tree prob...
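As a companion to the classical Steiner tree problem just recalled, here is a sketch of the textbook 2-approximation: take the metric closure on the terminals (all-pairs shortest paths) and return a minimum spanning tree of it, whose weight is at most twice the optimal Steiner tree. The graph and terminal choice below are invented for illustration.

```python
import heapq

# Sketch of the metric-closure + MST 2-approximation for Steiner tree
# (illustrative toy instance, not from the lecture).

def dijkstra(adj, src):
    """Shortest-path distances from src in a weighted adjacency dict."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def steiner_2approx_weight(adj, terminals):
    closure = {t: dijkstra(adj, t) for t in terminals}  # metric closure
    terminals = list(terminals)
    in_tree, weight = {terminals[0]}, 0
    # Prim's MST on the complete graph over terminals with closure distances.
    while len(in_tree) < len(terminals):
        w, t = min(
            (closure[u][t], t)
            for u in in_tree for t in terminals if t not in in_tree
        )
        in_tree.add(t)
        weight += w
    return weight

# Toy graph: hub h joined to three terminals by unit-weight edges.
adj = {
    "h": [("a", 1), ("b", 1), ("c", 1)],
    "a": [("h", 1)], "b": [("h", 1)], "c": [("h", 1)],
}
print(steiner_2approx_weight(adj, ["a", "b", "c"]))  # 4; the optimum (the star) is 3
```

The star example shows the analysis is nearly tight: the MST of the closure pays 2 per pairwise path, while the optimal tree shares the hub edges.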
Full text